Leave-one-out Bounds for Support Vector Regression Model Selection

Authors

  • Ming-Wei Chang
  • Chih-Jen Lin
Abstract

Minimizing bounds on the leave-one-out (loo) error is an important and efficient approach to support vector machine (SVM) model selection. Past research has focused on their use for classification rather than regression. In this article, we derive various loo bounds for support vector regression (SVR) and discuss how they differ from those for classification. Experiments demonstrate that the proposed bounds are competitive with Bayesian SVR for parameter selection. We also discuss the differentiability of loo bounds.
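The analytic loo bounds the paper derives exist to avoid refitting the model once per training point. As a point of reference, the exact quantity being bounded can be computed by brute force. A minimal sketch, assuming scikit-learn; the toy data and the parameter grid are illustrative, not from the paper:

```python
# Brute-force leave-one-out (loo) error for epsilon-SVR: the exact
# quantity that analytic loo bounds approximate at far lower cost.
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(40)

def loo_mse(C, gamma, eps=0.1):
    """Exact loo mean squared error; requires l model fits per setting."""
    errs = []
    for train, test in LeaveOneOut().split(X):
        model = SVR(C=C, gamma=gamma, epsilon=eps)
        model.fit(X[train], y[train])
        errs.append((model.predict(X[test])[0] - y[test][0]) ** 2)
    return float(np.mean(errs))

# Model selection: pick the (C, gamma) pair minimizing the loo estimate.
grid = [(C, g) for C in (1.0, 10.0) for g in (0.1, 1.0)]
best = min(grid, key=lambda p: loo_mse(*p))
```

Because each grid point costs l full fits, this baseline becomes impractical as the training set or the grid grows, which is exactly what motivates minimizing a cheap bound instead.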


Related articles

Meta Learning: Learning to Predict the Leave-One-Out Error

We propose a meta learning framework, casting leave-one-out (LOO) error approximation into a classification problem. For Support Vector Machines this means that we need to learn a classification of whether or not a given Support Vector, if left out of the data set, would be misclassified. For this learning task, simple data set dependent features are proposed, inspired by bounds from learning...


Radius-Margin Bound on the Leave-One-Out Error of a M-SVM

Using a support vector machine (SVM) requires setting the values of two types of hyperparameters: the soft margin parameter C and the parameters of the kernel. To perform this model selection task, the method of choice is cross-validation. Its leave-one-out variant is known to produce an estimator of the generalization error which is almost unbiased. Its major drawback rests in its time requirem...


Fast exact leave-one-out cross-validation of sparse least-squares support vector machines

Leave-one-out cross-validation has been shown to give an almost unbiased estimator of the generalisation properties of statistical models, and therefore provides a sensible criterion for model selection and comparison. In this paper we show that exact leave-one-out cross-validation of sparse Least-Squares Support Vector Machines (LS-SVMs) can be implemented with a computational complexity of on...
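The LS-SVM shortcut rests on the same closed-form identity that yields exact loo residuals for any regularized least-squares fit: e_i^loo = e_i / (1 - H_ii), where H is the hat matrix. A minimal sketch using plain ridge regression as a stand-in for the LS-SVM linear system (the data and λ are illustrative):

```python
# Exact loo residuals from a single fit via the hat-matrix identity
# e_loo_i = e_i / (1 - H_ii), verified against the naive n-refit loop.
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((30, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(30)
lam = 0.1  # regularization strength

# Hat matrix H = X (X^T X + lam I)^{-1} X^T for ridge regression.
H = X @ np.linalg.solve(X.T @ X + lam * np.eye(3), X.T)
resid = y - H @ y

# One fit total, instead of one fit per held-out point.
loo_fast = resid / (1.0 - np.diag(H))

# Naive baseline: refit with each point removed in turn.
loo_slow = np.empty_like(y)
for i in range(len(y)):
    m = np.arange(len(y)) != i
    w = np.linalg.solve(X[m].T @ X[m] + lam * np.eye(3), X[m].T @ y[m])
    loo_slow[i] = y[i] - X[i] @ w
```

The identity is exact for ridge (by the Sherman-Morrison formula), which is why `loo_fast` and `loo_slow` agree to machine precision; the cited paper extends this style of shortcut to sparse LS-SVMs.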


Model Selection for the l2-SVM by Following the Regularization Path

For a support vector machine, model selection consists in selecting the kernel function, the values of its parameters, and the amount of regularization. To set the value of the regularization parameter, one can minimize an appropriate objective function over the regularization path. A priori, this requires the availability of two elements: the objective function and an algorithm computing the r...


Leave-One-Out Bound for Crammer-Singer Multiclass Support Vector Machine

The selection of parameters in the support vector machine (SVM) is an important step in constructing a high-performance learning machine. Minimizing a bound on the leave-one-out (LOO) error is an efficient and time-saving approach for SVM parameter selection, and several well-known bounds have been proposed. However, this research focuses on binary classification but not multicla...



Journal:

Volume   Issue

Pages  -

Publication date: 2004